Future of law: UK’s SRA takes unprecedented approach in authorising AI-enabled law firms
Debbie Thomas
Monday 20 April 2026
In February, the Solicitors Regulation Authority (SRA) of England and Wales authorised LawFairy, which delivers legal services entirely through AI, to practise as a law firm. The SRA’s decision follows the authorisation of Garfield.Law in spring 2025 and shows the regulatory body taking a unique approach to the emergence of AI-enabled firms, one not currently mirrored elsewhere in the world.
LawFairy focuses on immigration, analysing visa eligibility, sponsorship and citizenship routes, while Garfield.Law assists businesses in England in pursuing small-debt claims of up to £10,000. Daniel Lundqvist, an officer of the IBA Technology Law Committee, explains that the SRA permits AI-led delivery only for ‘narrow, standardised areas of law’ and not as a way of ‘endorsing AI as a replacement for lawyers more broadly.’
Still, the SRA’s approach represents a ‘shift’ in the meaning of what a law firm is, says Sönke Lund, Chair of the IBA SPPI Working Group on AI. Lund doesn’t believe the traditional law firm model is on the verge of being displaced by the emergence of the likes of LawFairy. However, he says it faces competitive pressure in commoditised and rule-based practice areas, and a challenge to how it sees itself professionally. ‘The lawyer as [an] irreplaceable human intermediary is no longer a regulatory assumption,’ says Lund.
No other jurisdiction has yet followed the SRA’s approach in authorising AI-enabled firms. Pranav Srivastava, Co-Chair of the IBA Young Lawyers’ Committee, says that India, for example, bars non-lawyers from owning law firms and prefers AI adoption to be led by legal professionals themselves.
There are, however, reforms taking place in the US to allow for new types of legal service providers and business structures, including technology-enabled models in states such as Utah and Arizona.
Meanwhile, certain Canadian provinces have implemented ‘sandboxes’, enabling innovative legal services to be tested in a controlled environment, says Srivastava, a partner at Phoenix Legal in New Delhi. And the EU’s AI Act requires each Member State to establish at least one regulatory sandbox to develop, test and validate AI systems before they enter the market.
Announcing the authorisation of Garfield.Law, then-SRA Chief Executive, Paul Philip, highlighted the novel risks posed by an AI-driven law firm model. He said the SRA had ‘worked closely with this firm to make sure it can meet our rules, and all the appropriate protections are in place.’ Regarding LawFairy, an SRA spokesperson said the firm ‘was authorised in the same way we authorise any law firm we regulate.’
Philip Young, Co-Founder and CEO of Garfield.Law, says his team have been mindful of the risks of ‘so-called AI hallucinations.’ They mitigate these, he says, by operating ‘a hybrid of a deterministic expert system with a probabilistic LLM [large language model]-powered system’ in which the incidence of hallucinations is dramatically reduced because the deterministic system has overall control. Two other measures are in place – users review and approve all documents generated by Garfield before they’re sent out, and where LLMs are used to create documents, they’re checked by a lawyer before being issued, says Young.
LawFairy’s model, meanwhile, offers what Lund – a partner at ECIJA – describes as ‘reproducibility and traceability that human practitioners cannot guarantee.’ He does, however, warn of gaps in such a model. For example, laws aren’t always fully deterministic, while there’s a risk of ‘the failure to recognise that a fact pattern engages an exception or discretionary element that the encoded rules do not capture’. Lund also highlights concerns about ‘whether existing national complaints and redress mechanisms are well-calibrated or prepared for disputes arising from technology-only delivery, where no individual lawyer exercised judgement on the matter in question.’
Lund describes LawFairy as ‘a regulatory test case’ that ‘asks whether the protections traditionally secured through human professional judgement can be replicated or indeed improved upon through disciplined systems design, auditability and structural accountability.’
Sebastian Jenks, Head of Commercial at LawFairy, explains that in the field in which the firm operates, a large proportion of cases are rule-based, with eligibility turning on objective criteria as set out in the immigration rules. ‘It’s where deterministic systems are both appropriate and effective,’ he says.
The remainder of cases are evaluative: they include credibility assessments, matters involving proportionality under Article 8 of the European Convention on Human Rights and protection claims. ‘The [LawFairy] system structures the evidence, flagging them for human input,’ says Jenks. ‘The approach is not to force everything into automation, but to apply determinism where the law is genuinely rule-based, and to support, rather than replace, human judgement where it is evaluative.’
Lundqvist, a partner at Advokatfirman Kahn Pedersen in Stockholm, predicts the emergence of a split market in which AI providers handle routine, high-volume work and ‘traditional and specialised firms’ focus on complex, judgement-heavy matters.
Srivastava says that while Garfield.Law and LawFairy currently operate in very narrow areas, their authorisation could mean that over time ‘most standardisable legal work will be handled by similar technology-led models.’ Even if such firms don’t proliferate, clients may increasingly use AI tools directly, meaning some work that would traditionally have gone to law firms may never reach them, ‘particularly where cost sensitivity is high and the legal task is predictable,’ he adds.
‘The implication for law firms is not that the profession disappears, but that the volume of routine, process-driven work will reduce, and with it, a key source of steady, billable hours,’ says Srivastava. ‘This will inevitably put pressure on traditional hourly billing models, especially in areas where clients can access faster and cheaper alternatives.’
He sees law firms needing to move up the value chain by offering services that AI can’t readily replicate, such as ‘judgement and nuanced legal analysis, strategic thinking and advisory, complex problem-solving and negotiation, and context-driven decision-making.’
Lundqvist adds that the inherently human features of trust, reassurance and advice-giving are difficult to automate. ‘In my experience,’ he says, ‘legal practice is not just process; it involves experience, interpretation and client relationships.’ On this point, Jenks says that LawFairy users ‘can bring in a regulated adviser at any stage with a single click,’ and the system also flags when there’s a need for specialist input.
Young, however, takes a different position. At Garfield.Law, he explains, their multi-faceted approach and investment in LLM outputs mean it’s comparatively rare for a human lawyer to adjust something Garfield has done. ‘Thus we are rapidly reaching the point where the human-in-the-loop might be more a risk factor than a benefit,’ he says.
Header image: InfiniteFlow/Adobe Stock